
Design & Analysis of Algorithms

Unit 1 - Understanding Fundamentals

🎯 Chapter 1 — Algorithm and Program Performance
📊 ALGORITHM PERFORMANCE - MIND MAP
Algorithm → Step-by-step procedure to solve a problem
Time Complexity → How running time grows with input size
Space Complexity → How memory usage grows with input size
Analysis Types → Best Case, Average Case, Worst Case
Asymptotic Notations → Big O, Omega, Theta (growth comparison)
Recurrence → Substitution, Recursion Tree, Master Method
1. What is an Algorithm?
Definition:
An algorithm is a step-by-step procedure or set of rules designed to solve a specific problem or perform a specific task. Think of it as a recipe for solving a problem.
Real-life Example - Making Tea:
Step 1: Boil water
Step 2: Add tea leaves
Step 3: Add sugar
Step 4: Add milk
Step 5: Strain and serve

This is an algorithm for making tea! Similarly, algorithms in programming solve computational problems.
Characteristics of a Good Algorithm:
1. Input: Algorithm should have zero or more inputs (data we provide)
2. Output: Must produce at least one output (result)
3. Definiteness: Each step must be clear and unambiguous (no confusion)
4. Finiteness: Must terminate after a finite number of steps (should not run forever)
5. Effectiveness: Each step must be simple enough to be executed
6. Correctness: Should produce correct output for all valid inputs
Example Algorithm - Find Maximum of Two Numbers:
Algorithm: FindMaximum
Input: Two numbers a and b
Output: Maximum of the two numbers
Step 1: Start
Step 2: Read values of a and b
Step 3: If a > b then max = a, else max = b
Step 4: Print max
Step 5: Stop
Algorithm vs Program:
Algorithm: Language-independent solution (written in plain English or pseudocode)
Program: Implementation of algorithm in a specific programming language (Java, C++, Python)
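To make the distinction concrete, here is the FindMaximum algorithm turned into a short Python program (Python is chosen only for illustration; any language works):

```python
def find_maximum(a, b):
    """Return the larger of two numbers (Step 3 of the algorithm)."""
    if a > b:
        return a
    return b

print(find_maximum(7, 3))  # Step 4: print max
```

The algorithm stays the same whether this is written in Python, Java, or C++; only the program changes.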
2. Why Study Algorithms?
✓ Efficiency: Helps write faster programs (less time)
✓ Optimization: Reduces memory usage (less space)
✓ Problem Solving: Improves logical thinking and coding skills
✓ Scalability: Ensures program works well with large data
✓ Career: Important for interviews and competitive programming
Real-life Impact:
• Google Search uses algorithms to find results in milliseconds from billions of web pages
• GPS uses algorithms to find shortest route
• Netflix uses algorithms to recommend shows
• Banks use algorithms for fraud detection
3. Time Complexity
What is Time Complexity?
Time complexity is a measure of the amount of time an algorithm takes to run as a function of the input size. It tells us how the running time grows when the input size increases.

Important: We don't measure actual time in seconds (that depends on computer speed). Instead, we count the number of operations performed.

Simple Example - Printing Numbers:

Algorithm 1: Print numbers 1 to n
for (i = 1 to n)
  print i

Operations: n times (one print for each number)
Time Complexity: O(n) - grows linearly with input

Algorithm 2: Print all pairs
for (i = 1 to n)
  for (j = 1 to n)
    print (i, j)

Operations: n × n = n² times
Time Complexity: O(n²) - grows quadratically
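Counting operations rather than seconds can be done directly in code. A minimal Python sketch of both algorithms, with a counter standing in for the print statement:

```python
def count_linear(n):
    """Operations performed by the single loop: one per value of i."""
    ops = 0
    for i in range(1, n + 1):
        ops += 1  # stands in for "print i"
    return ops

def count_pairs(n):
    """Operations performed by the nested loops: one per (i, j) pair."""
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            ops += 1  # stands in for "print (i, j)"
    return ops
```

For n = 10 the first returns 10 and the second 100, matching O(n) and O(n²) growth.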
Common Time Complexities (from fastest to slowest):
📊 Time Complexity Hierarchy:
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
Remember:
• Lower time complexity = Faster algorithm
• We always analyze for large input sizes (as n → ∞)
• We ignore constants (2n and 5n both are O(n))
4. Space Complexity
What is Space Complexity?
Space complexity is the amount of memory space required by an algorithm to run as a function of the input size. It includes both auxiliary space and space used by input.
Space Complexity = Auxiliary Space + Input Space
Example 1 - Sum of Array Elements:
int sum = 0;
for (i = 0 to n-1)
  sum = sum + arr[i]


Variables used: sum, i (only 2 variables)
Space Complexity: O(1) - constant space
(Space doesn't grow with input size)
Example 2 - Copy Array:
Create new array temp[n]
for (i = 0 to n-1)
  temp[i] = arr[i]


Extra array of size n created
Space Complexity: O(n) - linear space
(Space grows with input size)
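Both examples can be sketched in Python (function names are illustrative); the O(n) version is the one that allocates a structure whose size grows with the input:

```python
def sum_array(arr):
    """O(1) auxiliary space: only 'total' and the loop index, whatever len(arr) is."""
    total = 0
    for x in arr:
        total += x
    return total

def copy_array(arr):
    """O(n) auxiliary space: allocates a new list as large as the input."""
    temp = [0] * len(arr)
    for i in range(len(arr)):
        temp[i] = arr[i]
    return temp
```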
Time vs Space Trade-off:
Sometimes we can make an algorithm faster by using more memory, or save memory by taking more time. This is called the time-space trade-off. For example, a hash table answers membership queries in O(1) time using O(n) extra space, while scanning an unsorted array uses O(1) extra space but O(n) time.
5. Average and Worst Case Analysis
What is Case Analysis?
Case analysis means analyzing an algorithm's performance under different scenarios. We typically analyze three cases: Best Case, Average Case, and Worst Case.
1. Best Case Analysis
Best Case: The scenario where the algorithm performs the minimum number of operations. This is the most optimistic situation.
Example - Linear Search:
Task: Find element 'x' in array [10, 20, 30, 40, 50]
If x = 10 (first element), we find it in just 1 comparison.
Best Case Time: O(1) - constant time
2. Average Case Analysis
Average Case: The expected performance when considering all possible inputs. We calculate the average number of operations across all scenarios.
Example - Linear Search:
Array: [10, 20, 30, 40, 50]
Element could be at any position (or not present)
Average comparisons = (1 + 2 + 3 + 4 + 5) / 5 = 3, i.e. roughly (n + 1) / 2 in general
Average Case Time: O(n) - linear time
3. Worst Case Analysis
Worst Case: The scenario where the algorithm performs the maximum number of operations. This is the most pessimistic situation and most commonly used for analysis.
Example - Linear Search:
Array: [10, 20, 30, 40, 50]
If x = 50 (last element) or x is not present, we check all n elements.
Worst Case Time: O(n) - linear time
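All three cases can be observed with a single counting implementation (a sketch; the comparison counter is added purely for illustration):

```python
def linear_search(arr, x):
    """Return (index, comparisons); index is -1 if x is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == x:
            return i, comparisons
    return -1, comparisons
```

Searching [10, 20, 30, 40, 50] for 10 costs 1 comparison (best case), while searching for 50 or a missing value costs all 5 (worst case).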
📊 Complete Asymptotic Notations Comparison
Why Worst Case is Important:
• Guarantees algorithm won't perform worse than this
• Useful for critical systems (medical, aviation, banking)
• Provides upper bound on running time
• Most commonly used in analysis
Detailed Example - Insertion Sort:

Best Case: Array is already sorted [1, 2, 3, 4, 5]
→ Only n-1 comparisons needed → O(n)

Average Case: Array is randomly ordered
→ Average of n²/4 comparisons → O(n²)

Worst Case: Array is reverse sorted [5, 4, 3, 2, 1]
→ Maximum comparisons: 1+2+3+...+(n-1) = n(n-1)/2 → O(n²)
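Those comparison counts can be checked with an instrumented insertion sort (a sketch; the counter is an illustrative addition):

```python
def insertion_sort(arr):
    """Sort arr in place and return the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1            # compare key against arr[j]
            if arr[j] > key:
                arr[j + 1] = arr[j]     # shift the larger element right
                j -= 1
            else:
                break
        arr[j + 1] = key
    return comparisons
```

For n = 5, a sorted input costs 4 comparisons (n - 1) and a reverse-sorted input costs 10 (n(n-1)/2).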
6. Asymptotic Notations
What are Asymptotic Notations?
Asymptotic notations are mathematical tools used to describe the running time of an algorithm when the input size approaches infinity. They help us compare algorithms by ignoring constant factors and focusing on growth rate.

Why "Asymptotic"? We analyze how the algorithm behaves as input size (n) becomes very large (tends to infinity).

Three Main Asymptotic Notations:
1. Big O (O) - Upper Bound (most often quoted for the worst case)
2. Big Omega (Ω) - Lower Bound (most often quoted for the best case)
3. Big Theta (Θ) - Tight Bound (when upper and lower bounds match)
7. Big O Notation (Upper Bound)
Big O Notation:
Big O describes the upper bound of an algorithm's running time. It gives the worst-case scenario - the maximum time an algorithm could take.

Think of it as: "The algorithm will take AT MOST this much time"
f(n) = O(g(n)) if there exist positive constants c and n₀ such that:
f(n) ≤ c × g(n) for all n ≥ n₀
Simple Explanation:
If an algorithm takes 3n² + 5n + 2 operations:
• For large n, n² term dominates
• We ignore constants (3, 5, 2)
• We ignore lower order terms (5n, 2)
• Big O = O(n²)

This means the algorithm won't take more than n² time (ignoring constants).
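The definition can be checked numerically. A minimal sketch using the assumed witness pair c = 4 and n₀ = 6 (one valid choice among many):

```python
def f(n):
    """Operation count from the example above."""
    return 3 * n * n + 5 * n + 2

def g(n):
    """Candidate bounding function."""
    return n * n

# f(n) <= 4 * g(n) holds for every n >= 6, which is exactly the
# Big O condition with c = 4 and n0 = 6, so f(n) = O(n²).
```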
📊 Big O, Omega, Theta Comparison
Real Code Example:
for (i = 0; i < n; i++) {
  for (j = 0; j < n; j++) {
    print(i, j);
  }
}


Outer loop: n times
Inner loop: n times for each outer iteration
Total operations: n × n = n²
Big O = O(n²)
Important Properties:
• O(n) + O(n) = O(2n) = O(n) - sequential steps add, and constants drop
• O(n) × O(n) = O(n²) - nested steps multiply
• O(2n) = O(n) - constant factors are dropped
• O(n² + n) = O(n²) - lower-order terms are dropped
8. Big Omega Notation (Ω) - Lower Bound
Big Omega Notation:
Big Omega describes the lower bound of an algorithm's running time. It gives the best-case scenario - the minimum time an algorithm could take.

Think of it as: "The algorithm will take AT LEAST this much time"
f(n) = Ω(g(n)) if there exist positive constants c and n₀ such that:
f(n) ≥ c × g(n) for all n ≥ n₀
Example - Linear Search:
Best case: Element found at first position
Time taken: 1 comparison (constant)
Ω(1) - Will take at least constant time

Even in the best case, we must do at least 1 comparison, so Ω(1).
Example - Merge Sort:
Even in the best case (already sorted array), merge sort has to:
• Divide the array: log n levels
• Merge at each level: n operations
Total: n log n operations
Ω(n log n) - Will take at least n log n time
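A Python sketch of merge sort that counts the elements written during merging (the counting scheme is an illustrative assumption) gives the same total for sorted and reverse-sorted input, matching the Ω(n log n) claim:

```python
def merge_sort_ops(arr):
    """Return (sorted copy, number of elements written during merges)."""
    if len(arr) <= 1:
        return list(arr), 0
    mid = len(arr) // 2
    left, ops_l = merge_sort_ops(arr[:mid])    # divide
    right, ops_r = merge_sort_ops(arr[mid:])
    merged, ops = [], ops_l + ops_r
    i = j = 0
    while i < len(left) and j < len(right):    # merge
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
        ops += 1
    ops += (len(left) - i) + (len(right) - j)  # leftover elements copied over
    merged += left[i:] + right[j:]
    return merged, ops
```

For n = 8 the write count is 24 = 8 × log₂ 8 regardless of the initial order, since every merge writes every element once.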
Remember:
• Big O = upper bound (at most this much time)
• Big Omega = lower bound (at least this much time)
• Big O is used most often in practice
9. Big Theta Notation (Θ) - Tight Bound
Big Theta Notation:
Big Theta describes the tight bound of an algorithm's running time. It means the algorithm's running time is bounded both from above and below by the same function.

Think of it as: "The algorithm will take EXACTLY this much time (within constant factors)"
f(n) = Θ(g(n)) if and only if:
f(n) = O(g(n)) AND f(n) = Ω(g(n))
Simple Explanation:
If an algorithm has:
• Best case = Ω(n²) (lower bound)
• Worst case = O(n²) (upper bound)
Then we can say it is Θ(n²) (tight bound)

This means the algorithm always takes around n² time, regardless of input.
Example - Merge Sort:
Best case: Ω(n log n)
Worst case: O(n log n)
Since both are same: Θ(n log n)

Merge sort always takes n log n time, whether array is sorted, reverse sorted, or random.
Example - Simple Loop:
for (i = 0; i < n; i++) {
  print(i);
}


This loop ALWAYS runs exactly n times.
Best case = n, Worst case = n
Θ(n) - tight bound
📊 Examples of Big O Notation
10. Asymptotic Notations - Detailed Comparison
📊 Case Analysis Summary for Common Algorithms
Practical Example - Binary Search:

Best Case (Ω):
Element found in middle on first try
Ω(1) - at least constant time

Worst Case (O):
Element at end or not present
Need to halve the search range about log₂ n times
O(log n) - at most logarithmic time

Average Case (Θ):
On average, takes log n comparisons
Θ(log n) - exactly logarithmic time
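A counting binary search makes the three cases tangible (a sketch; the probe counter is illustrative):

```python
def binary_search(arr, x):
    """Return (index, probes) for sorted arr; index is -1 if x is absent."""
    lo, hi = 0, len(arr) - 1
    probes = 0
    while lo <= hi:
        probes += 1
        mid = (lo + hi) // 2
        if arr[mid] == x:
            return mid, probes
        elif arr[mid] < x:
            lo = mid + 1      # discard the left half
        else:
            hi = mid - 1      # discard the right half
    return -1, probes
```

On a sorted array of 16 elements, finding the middle element takes 1 probe (best case), while a missing element takes 4 = log₂ 16 probes (worst case).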
When to Use Which:
• Use Big O when analyzing worst-case (most common)
• Use Big Omega when proving lower bounds
• Use Big Theta when best and worst cases are same
• In practice, "Big O" is used most frequently
11. Common Mistakes in Complexity Analysis
⚠️ Mistake 1: Not Dropping Constants
❌ Wrong: O(2n) or O(5n²)
✓ Correct: O(n) and O(n²)
Constants are always dropped!

⚠️ Mistake 2: Not Dropping Lower Order Terms
❌ Wrong: O(n² + n)
✓ Correct: O(n²)
Keep only the highest order term!

⚠️ Mistake 3: Confusing Best/Worst Case with Big O/Omega
• Best case can still have Big O notation
• Worst case can still have Big Omega notation
• They are different concepts!

⚠️ Mistake 4: Thinking O(n) is Always Better than O(n²)
For small n, an O(n²) algorithm with a small constant can be faster
Example: an O(n²) algorithm costing 2n² operations vs an O(n) algorithm costing 1000n operations
For n < 500, the O(n²) algorithm is actually faster, since 2n² < 1000n!
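The crossover can be computed directly from the hypothetical operation counts in the example (2n² and 1000n):

```python
def ops_quadratic(n):
    """Hypothetical O(n²) algorithm with a small constant factor."""
    return 2 * n * n

def ops_linear(n):
    """Hypothetical O(n) algorithm with a large constant factor."""
    return 1000 * n

# 2n² < 1000n exactly when n < 500, so the quadratic algorithm
# wins for small inputs despite its worse Big O.
```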
12. Practice: Finding Time Complexity
Example 1: Simple Loop
for (i = 0; i < n; i++) {
  print(i);
}

Loop runs n times → O(n)
Example 2: Nested Loops (Same Limit)
for (i = 0; i < n; i++) {
  for (j = 0; j < n; j++) {
    print(i, j);
  }
}

Outer: n times, Inner: n times → n × n → O(n²)
Example 3: Nested Loops (Different Limits)
for (i = 0; i < n; i++) {
  for (j = 0; j < m; j++) {
    print(i, j);
  }
}

Outer: n times, Inner: m times → n × m → O(n×m)
Example 4: Loop with Division
for (i = n; i > 0; i = i/2) {
  print(i);
}

Each iteration divides by 2
n → n/2 → n/4 → ... → 1
Takes log₂(n) iterations → O(log n)
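The iteration count can be verified empirically (the helper name is illustrative); it comes out to ⌊log₂ n⌋ + 1:

```python
def halving_iterations(n):
    """Count iterations of: for (i = n; i > 0; i = i / 2)."""
    count = 0
    i = n
    while i > 0:
        count += 1
        i //= 2   # integer halving, mirroring the pseudocode
    return count
```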
Example 5: Sequential Code
for (i = 0; i < n; i++) {
  print(i);
}
for (j = 0; j < n; j++) {
  print(j);
}

First loop: O(n), Second loop: O(n)
Total: O(n) + O(n) = O(2n) = O(n)
Example 6: Dependent Nested Loop
for (i = 0; i < n; i++) {
  for (j = 0; j < i; j++) {
    print(i, j);
  }
}

When i=0: 0 times, i=1: 1 time, i=2: 2 times, ..., i=n-1: n-1 times
Total: 0+1+2+...+(n-1) = n(n-1)/2 → O(n²)
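The triangular sum can be confirmed by running the loop and counting (the counter is an illustrative stand-in for the print):

```python
def dependent_loop_ops(n):
    """Count operations in the dependent nested loop."""
    ops = 0
    for i in range(n):
        for j in range(i):   # inner bound grows with i
            ops += 1         # stands in for "print (i, j)"
    return ops
```

dependent_loop_ops(n) always equals n(n-1)/2, which is O(n²).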
🔁 Chapter 2 — Recurrence Equations & Solutions
📊 RECURRENCE - MIND MAP
Recurrence → Function defined using smaller inputs
Substitution → Guess + Prove by induction
Recursion Tree → Draw tree, sum level costs
Master Method → T(n)=aT(n/b)+f(n) → 3 cases